41 research outputs found

    A Nonlinear Projection Neural Network for Solving Interval Quadratic Programming Problems and Its Stability Analysis

    This paper presents a nonlinear projection neural network for solving interval quadratic programs subject to box-set constraints in engineering applications. Based on the saddle point theorem, the equilibrium point of the proposed neural network is proved to be equivalent to the optimal solution of the interval quadratic optimization problem. By employing the Lyapunov function approach, the global exponential stability of the proposed neural network is analyzed. Two illustrative examples are provided to show the feasibility and efficiency of the proposed method.
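
    A minimal sketch of the projection-network idea for a box-constrained quadratic program is given below; the matrix Q, vector c, bounds, gain, and step size are illustrative assumptions, not taken from the paper, and the interval uncertainty is not modelled. The state is continuously pushed toward the projection of a gradient step onto the box, and its equilibrium coincides with the QP minimizer.

    import numpy as np

    # Illustrative (assumed) data: minimize 0.5*x'Qx + c'x subject to lo <= x <= hi.
    Q = np.array([[4.0, 1.0], [1.0, 3.0]])
    c = np.array([-1.0, -2.0])
    lo, hi = np.array([0.0, 0.0]), np.array([2.0, 2.0])

    def project_box(y):
        # Projection onto the box constraint set [lo, hi]
        return np.clip(y, lo, hi)

    # Projection-network dynamics dx/dt = -x + P(x - alpha*(Qx + c)), integrated
    # with forward Euler; the equilibrium satisfies the fixed-point condition of
    # the box-constrained QP.
    alpha, dt = 0.1, 0.01
    x = np.zeros(2)
    for _ in range(20000):
        x = x + dt * (-x + project_box(x - alpha * (Q @ x + c)))

    print("approximate minimizer:", x)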

    Bipartite consensus for multi-agent networks of fractional diffusion PDEs via aperiodically intermittent boundary control

    In this paper, the exponential bipartite consensus problem is investigated for multi-agent networks whose dynamics are characterized by fractional diffusion partial differential equations (PDEs). The main contribution is a novel exponential convergence principle for networks of fractional PDEs under an aperiodically intermittent control scheme. First, under the aperiodically intermittent control strategy, an exponential convergence principle is developed for continuously differentiable functions. Second, on the basis of the proposed convergence principle and the designed intermittent boundary control protocol, the exponential bipartite consensus condition is formulated in terms of linear matrix inequalities (LMIs). Compared with existing works, the exponential intermittent consensus result presented in this paper applies to networks of PDEs. Finally, a high-speed aerospace vehicle model is used to verify the effectiveness of the control protocol.
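
    The sketch below illustrates only the aperiodically intermittent control idea on an assumed scalar plant; the plant, gains, and work/rest intervals are hypothetical, and the paper's fractional PDE dynamics and signed-graph (bipartite) structure are not modelled. Feedback acts only on irregular "work" subintervals, yet the state still decays exponentially when those subintervals occupy a sufficient fraction of time.

    import numpy as np

    # Assumed scalar plant dx/dt = a*x + u, unstable without control (a > 0).
    a, k = 0.5, 3.0                              # illustrative plant and feedback gains
    # Aperiodic schedule: (start, end) of the control "work" intervals.
    work_intervals = [(0.0, 0.7), (1.0, 1.9), (2.5, 3.4), (4.0, 4.8)]

    def control_active(t):
        return any(s <= t < e for s, e in work_intervals)

    dt, T = 1e-3, 5.0
    x = 1.0
    for i in range(int(T / dt)):
        t = i * dt
        u = -k * x if control_active(t) else 0.0  # intermittent feedback
        x += dt * (a * x + u)

    print("final |x|:", abs(x))                   # decays despite control-free rest intervals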

    Fixed-time synchronization of semi-Markovian jumping neural networks with time-varying delays

    This paper is concerned with the global fixed-time synchronization problem for semi-Markovian jumping neural networks with time-varying delays. A novel state-feedback controller, which includes integral terms and time-varying delay terms, is designed to achieve fixed-time synchronization between the drive system and the response system. By applying the Lyapunov functional approach and matrix inequality analysis techniques, the fixed-time synchronization conditions are expressed in terms of linear matrix inequalities (LMIs). Finally, two numerical examples are provided to illustrate the feasibility of the proposed control scheme and the validity of the theoretical results.
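
    The following sketch shows only the fixed-time convergence mechanism on a scalar synchronization error; the gains and exponents are assumptions, and the paper's controller additionally contains integral and time-varying delay terms and handles the semi-Markovian jumps. Combining a sublinear and a superlinear power of the error gives a settling time that is bounded independently of the initial condition.

    import numpy as np

    # Fixed-time-type law on the error e = x_response - x_drive:
    #   de/dt = -k1*|e|^p*sign(e) - k2*|e|^q*sign(e),  with 0 < p < 1 < q.
    k1, k2, p, q = 2.0, 2.0, 0.5, 1.5             # illustrative gains and exponents

    def settle_time(e0, dt=1e-4, tol=1e-6, t_max=10.0):
        e, t = e0, 0.0
        while abs(e) > tol and t < t_max:
            e += dt * (-k1 * abs(e)**p * np.sign(e) - k2 * abs(e)**q * np.sign(e))
            t += dt
        return t

    # The settling time stays bounded even as the initial error grows.
    for e0 in (1.0, 10.0, 1000.0):
        print(e0, "->", round(settle_time(e0), 3))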

    Global Stability Analysis for Periodic Solution in Discontinuous Neural Networks with Nonlinear Growth Activations

    This paper considers a new class of additive neural networks in which the neuron activations are modelled by discontinuous functions with nonlinear growth. By the Leray-Schauder alternative theorem in differential inclusion theory, matrix theory, and a generalized Lyapunov approach, a general result is derived which ensures the existence and global asymptotic stability of a unique periodic solution for such neural networks. The obtained results apply to neural networks with a broad range of activation functions, assuming neither boundedness nor monotonicity, and also show that Forti's conjecture for discontinuous neural networks with nonlinear growth activations is true.
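
    As a purely hypothetical illustration (the network size, weights, activation, and input below are assumed, not taken from the paper), the sketch simulates an additive network with a discontinuous, nonlinearly growing activation and a periodic input; two trajectories started far apart approach each other, which is the behaviour that the existence, uniqueness, and global stability result formalizes.

    import numpy as np

    # Assumed 2-neuron additive network dx/dt = -D x + A f(x) + I(t), with a
    # discontinuous activation of nonlinear growth: f(s) = sign(s) + 0.1*s^3.
    D = np.diag([2.0, 2.0])
    A = np.array([[0.3, -0.2], [0.1, 0.25]])

    def f(s):
        return np.sign(s) + 0.1 * s**3

    def I(t):                                     # periodic external input
        return np.array([np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

    def step(x, t, dt):
        return x + dt * (-D @ x + A @ f(x) + I(t))

    dt, T = 1e-3, 20.0
    x1, x2 = np.array([3.0, -2.0]), np.array([-1.0, 4.0])
    for i in range(int(T / dt)):
        x1, x2 = step(x1, i * dt, dt), step(x2, i * dt, dt)

    # Small distance: both trajectories settle onto the same periodic orbit.
    print("distance between trajectories:", np.linalg.norm(x1 - x2))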

    Almost Periodic Solution for Memristive Neural Networks with Time-Varying Delays

    This paper is concerned with the dynamical stability analysis of almost periodic solutions of memristive neural networks with time-varying delays. Under the framework of Filippov solutions, the existence and asymptotically almost periodic behavior of solutions are discussed by applying inequality analysis techniques. Based on differential inclusion theory and the Lyapunov functional approach, the stability of the almost periodic solution is investigated, and a sufficient condition for its existence, uniqueness, and global exponential stability is established. Moreover, as a special case, a condition which ensures the global exponential stability of a unique periodic solution is also presented for the considered memristive neural networks. Two examples are given to illustrate the validity of the theoretical results.
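
    A minimal sketch of the memristive feature is given below; the weight levels, threshold, delay, and input are illustrative assumptions. The connection weights switch between two levels depending on the neuron state, which is the kind of state-dependent switching the Filippov framework is designed to handle, and the forcing uses incommensurate frequencies in the spirit of an almost periodic input.

    import numpy as np

    # Assumed memristive weight: a_ij switches between two levels with the state x_i.
    def memristive_weight(xi, a_low=0.2, a_high=-0.3, threshold=1.0):
        return a_low if abs(xi) <= threshold else a_high

    def f(s):
        return np.tanh(s)                          # bounded activation

    dt, tau = 1e-3, 0.2                            # step size and constant delay (illustrative)
    delay_steps = int(tau / dt)
    history = [np.array([2.0, -2.0])] * (delay_steps + 1)   # constant initial history

    for i in range(30000):
        t = i * dt
        x, x_del = history[-1], history[-delay_steps - 1]
        A = np.array([[memristive_weight(x[i_]) for _ in range(2)] for i_ in range(2)])
        I = np.array([np.sin(t), np.cos(np.sqrt(2) * t)])    # almost periodic input
        history.append(x + dt * (-x + A @ f(x_del) + I))

    print("state after transient:", history[-1])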
